A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants
Authors
Abstract
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio with a Monte Carlo estimate in the simulation, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary-variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement of perfect sampling and can therefore be applied to many statistical models for which perfect sampling is unavailable or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effects models and missing data problems that involve simulation from a distribution with intractable integrals.
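The key step described in the abstract, substituting a Monte Carlo estimate for the unknown normalizing-constant ratio inside the Metropolis-Hastings acceptance probability, can be sketched for a one-parameter exponential-family model. The sketch below is illustrative only: the function names, the toy target f(x | theta) proportional to exp(theta * S(x)) / Z(theta), and the random-walk proposal are assumptions, not the letter's reference implementation.

```python
import numpy as np

def estimate_Z_ratio(theta_new, theta_old, aux_samples, S):
    """Importance-sampling estimate of Z(theta_new) / Z(theta_old) from draws
    y_1, ..., y_m ~ f(. | theta_old): mean of exp((theta_new - theta_old) * S(y_i))."""
    stats = np.array([S(y) for y in aux_samples])
    return np.mean(np.exp((theta_new - theta_old) * stats))

def mcmh_step(theta, log_prior, S, x_obs, draw_aux, m=100, prop_sd=0.5, rng=None):
    """One MCMH update of theta for the toy posterior
    pi(theta | x) propto prior(theta) * exp(theta * S(x)) / Z(theta)."""
    rng = np.random.default_rng() if rng is None else rng
    theta_prop = theta + prop_sd * rng.standard_normal()  # symmetric random-walk proposal
    # Auxiliary draws from f(. | theta); the letter's point is that these need not
    # come from a perfect sampler (e.g. a short MCMC run at theta may suffice).
    aux = [draw_aux(theta, rng) for _ in range(m)]
    z_ratio_hat = estimate_Z_ratio(theta_prop, theta, aux, S)  # ~ Z(theta')/Z(theta)
    # MH log-ratio with the intractable Z(theta)/Z(theta') replaced by 1/z_ratio_hat.
    log_alpha = (log_prior(theta_prop) - log_prior(theta)
                 + (theta_prop - theta) * S(x_obs)
                 - np.log(z_ratio_hat))
    return theta_prop if np.log(rng.uniform()) < log_alpha else theta
```

For the autologistic and exponential random graph models mentioned in the abstract, S(x) would be a vector of sufficient statistics and the scalar products above become inner products.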
Related references
Monte Carlo Methods on Bayesian Analysis of Constrained Parameter Problems with Normalizing Constants
Constraints on the parameters in a Bayesian hierarchical model typically make Bayesian computation and analysis complicated. As Gelfand, Smith and Lee (1992) remarked, it is almost impossible to sample from a posterior distribution when its density contains analytically intractable integrals (normalizing constants) that depend on the (hyper) parameters. Therefore, the Gibbs sampler or the Metro...
Sequentially Interacting Markov Chain Monte Carlo Methods
We introduce a novel methodology for sampling from a sequence of probability distributions of increasing dimension and estimating their normalizing constants. These problems are usually addressed using Sequential Monte Carlo (SMC) methods. The alternative Sequentially Interacting Markov Chain Monte Carlo (SIMCMC) scheme proposed here works by generating interacting non-Markovian sequences which...
Miscellanea: An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants
Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable method is presented which requires only that independent samples can be drawn from the unnormalised densi...
Advances in Markov chain Monte Carlo methods
Probability distributions over many variables occur frequently in Bayesian inference, statistical physics and simulation studies. Samples from distributions give insight into their typical behavior and can allow approximation of any quantity of interest, such as expectations or normalizing constants. Markov chain Monte Carlo (MCMC), introduced by Metropolis et al. (1953), allows sampling from d...
MCMC for Doubly-intractable Distributions
Markov Chain Monte Carlo (MCMC) algorithms are routinely used to draw samples from distributions with intractable normalization constants. However, standard MCMC algorithms do not apply to doublyintractable distributions in which there are additional parameter-dependent normalization terms; for example, the posterior over parameters of an undirected graphical model. An ingenious auxiliary-varia...
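For contrast with the MCMH sketch above, the auxiliary-variable idea this entry alludes to can be illustrated by an exchange-algorithm-style update, in which data simulated from the likelihood at the proposed parameter make the intractable normalizing constants cancel in the acceptance ratio. The toy model and names below are assumptions carried over from the earlier sketch; note that this construction relies on exact (perfect) simulation of the auxiliary data, which is precisely what the MCMH algorithm avoids.

```python
import numpy as np

def exchange_step(theta, log_prior, S, x_obs, draw_exact, prop_sd=0.5, rng=None):
    """One exchange-type update for the toy target f(x | theta) = exp(theta*S(x))/Z(theta)."""
    rng = np.random.default_rng() if rng is None else rng
    theta_prop = theta + prop_sd * rng.standard_normal()
    # Auxiliary data drawn *exactly* from f(. | theta_prop); all Z(.) terms cancel below.
    y = draw_exact(theta_prop, rng)
    log_alpha = (log_prior(theta_prop) - log_prior(theta)
                 + theta_prop * S(x_obs) + theta * S(y)
                 - theta * S(x_obs) - theta_prop * S(y))
    return theta_prop if np.log(rng.uniform()) < log_alpha else theta
```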
Journal: Neural Computation
Volume 25, Issue 8
Pages: -
Publication year: 2013